# Multi-task pre-training
## All Indo E5 Small V4
An Indonesian text embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 384-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
Text Embedding · Transformers · by LazarusNLP · 3,039 downloads · 7 likes
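Embedding models like this are typically used for semantic search by ranking documents by cosine similarity against a query vector. A minimal sketch with toy 4-dimensional vectors standing in for the model's 384-dimensional embeddings (these values are illustrative, not actual model output):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for the model's 384-dimensional sentence embeddings.
query = [0.1, 0.3, 0.5, 0.1]
docs = {
    "doc_a": [0.1, 0.3, 0.5, 0.1],  # same direction as the query
    "doc_b": [0.5, 0.1, 0.1, 0.3],
}

# Semantic search = pick the document whose embedding is closest to the query.
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # doc_a
```

The same ranking step applies unchanged to real embeddings; only the vectors (and their dimensionality) come from the model instead of being hard-coded.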
## InCaseLawBERT
InCaseLawBERT is a BERT model pre-trained on Indian legal texts, focused on natural language processing tasks for Indian law.
Large Language Model · Transformers · English · MIT license · by law-ai · 546 downloads · 19 likes
## UniSpeech 1350 En 353 Fr Ft 1h
UniSpeech is a unified speech representation learning model that combines labeled and unlabeled data during pre-training; this checkpoint is fine-tuned for French.
Speech Recognition · Transformers · French · by microsoft · 20 downloads · 0 likes
## UniSpeech 1350 En 90 It Ft 1h
UniSpeech is a unified speech representation learning model that combines supervised phoneme CTC learning with self-supervised learning; this checkpoint is fine-tuned for Italian.
Speech Recognition · Transformers · Other · by microsoft · 19 downloads · 0 likes
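The CTC objective used in UniSpeech's supervised branch emits one label per audio frame and recovers the final sequence by collapsing repeated labels and dropping blanks. A minimal greedy-decode sketch of that collapse rule (label indices here are hypothetical, not the actual UniSpeech phoneme vocabulary):

```python
def ctc_greedy_collapse(frame_labels, blank=0):
    # Standard CTC decode rule: merge consecutive repeats, then drop blanks.
    out = []
    prev = None
    for label in frame_labels:
        if label != prev and label != blank:
            out.append(label)
        prev = label
    return out

# Per-frame argmax output from an acoustic model; 0 is the CTC blank symbol.
frames = [0, 3, 3, 0, 3, 5, 5, 0]
print(ctc_greedy_collapse(frames))  # [3, 3, 5]
```

Note that the second `3` survives because a blank separates the two runs, which is how CTC distinguishes a repeated phoneme from one phoneme spanning several frames.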
## Code Trans T5 Small Code Documentation Generation Go Multitask
A Go code documentation generation model based on the T5-small architecture, with multi-task training support.
Text Generation · by SEBIS · 17 downloads · 0 likes
## Code Trans T5 Large Code Comment Generation Java Multitask Finetune
A pre-trained model based on the T5-large architecture, designed for generating comment documentation for Java functions and methods.
Large Language Model · by SEBIS · 22 downloads · 0 likes
## Code Trans T5 Large Code Documentation Generation Java Multitask
A Java code documentation generation model based on the T5-large architecture, with multi-task training support, strong at generating Java function descriptions.
Large Language Model · by SEBIS · 13 downloads · 0 likes
## Code Trans T5 Small Api Generation Multitask
A multi-task pre-trained model based on the T5-small architecture, specialized in generating API usage suggestions for Java programming tasks.
Large Language Model · by SEBIS · 25 downloads · 1 like
## Code Trans T5 Small Code Comment Generation Java Multitask
A Java code comment generation model based on the T5-small architecture, with multi-task training support, suited to automatic Java function documentation.
Text Generation · by SEBIS · 17 downloads · 0 likes
## Code Trans T5 Base Code Documentation Generation Java Multitask
A pre-trained model based on the T5 architecture, designed for generating Java function documentation, with multi-task processing support.
Text Generation · by SEBIS · 57 downloads · 1 like
## Code Trans T5 Base Code Documentation Generation Python Multitask Finetune
A Python code documentation generation model based on the T5 architecture, pre-trained and fine-tuned with multi-task learning, designed for generating Python function documentation.
Text Generation · by SEBIS · 26 downloads · 1 like
## Code Trans T5 Base Code Documentation Generation Go Multitask Finetune
A Go code documentation generation model based on the T5 architecture, pre-trained and fine-tuned with multi-task learning, designed for generating documentation for Go functions and methods.
Text Generation · by SEBIS · 15 downloads · 0 likes
## Code Trans T5 Base Code Comment Generation Java Multitask Finetune
A Java code comment generation model based on the T5 architecture, optimized through multi-task pre-training and fine-tuning, designed to generate descriptive text for Java functions.
Large Language Model · by SEBIS · 16 downloads · 0 likes